Context-aware Synthesis for Video Frame Interpolation

Authors

  • Simon Niklaus
  • Feng Liu
Abstract

Video frame interpolation algorithms typically estimate optical flow or its variations and then use it to guide the synthesis of an intermediate frame between two consecutive original frames. To handle challenges like occlusion, bidirectional flow between the two input frames is often estimated and used to warp and blend the input frames. However, how to effectively blend the two warped frames remains a challenging problem. This paper presents a context-aware synthesis approach that warps not only the input frames but also their pixel-wise contextual information and uses them to interpolate a high-quality intermediate frame. Specifically, we first use a pre-trained neural network to extract per-pixel contextual information for the input frames. We then employ a state-of-the-art optical flow algorithm to estimate bidirectional flow between them and pre-warp both input frames and their context maps. Finally, unlike common approaches that blend the pre-warped frames, our method feeds them and their context maps to a video frame synthesis neural network to produce the interpolated frame in a context-aware fashion. Our neural network is fully convolutional and is trained end to end. Our experiments show that our method can handle challenging scenarios such as occlusion and large motion and outperforms representative state-of-the-art approaches.
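The pre-warping step the abstract describes can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration, not the authors' implementation: it backward-warps a frame (or, identically, its per-pixel context map) by a scaled flow field under a linear-motion approximation, using bilinear sampling. The function names and the choice of NumPy are illustrative only.

```python
import numpy as np

def backward_warp(image, flow):
    """Warp `image` by bilinear sampling at positions displaced by `flow`.
    image: (H, W, C) array; flow: (H, W, 2) array of (dx, dy) offsets."""
    H, W, _ = image.shape
    ys, xs = np.meshgrid(np.arange(H), np.arange(W), indexing="ij")
    sx = xs + flow[..., 0]          # source x-coordinate for each output pixel
    sy = ys + flow[..., 1]          # source y-coordinate for each output pixel
    x0 = np.clip(np.floor(sx).astype(int), 0, W - 2)
    y0 = np.clip(np.floor(sy).astype(int), 0, H - 2)
    wx = np.clip(sx - x0, 0.0, 1.0)[..., None]   # horizontal blend weight
    wy = np.clip(sy - y0, 0.0, 1.0)[..., None]   # vertical blend weight
    # Bilinear interpolation over the four neighbouring pixels.
    return ((1 - wy) * ((1 - wx) * image[y0, x0] + wx * image[y0, x0 + 1])
            + wy * ((1 - wx) * image[y0 + 1, x0] + wx * image[y0 + 1, x0 + 1]))

def prewarp_pair(frame0, frame1, flow01, flow10, t=0.5):
    """Pre-warp both inputs toward time t by scaling the bidirectional flows
    (linear-motion assumption). Context maps would be warped the same way."""
    return backward_warp(frame0, t * flow01), backward_warp(frame1, (1 - t) * flow10)
```

In the paper's pipeline, the two warped frames and their warped context maps would then be concatenated and fed to the synthesis network rather than simply blended.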


Similar articles

A Temporally-Aware Interpolation Network for Video Frame Inpainting

We propose the first deep learning solution to video frame inpainting, a challenging instance of the general video inpainting problem with applications in video editing, manipulation, and forensics. Our task is less ambiguous than frame interpolation and video prediction because we have access to both the temporal context and a partial glimpse of the future, allowing us to better evaluate the q...


Occlusion-aware temporal frame interpolation in a highly scalable video coding setting

We recently proposed a bidirectional hierarchical anchoring (BIHA) of motion fields for highly scalable video coding. The BIHA scheme employs piecewise-smooth motion fields, and uses breakpoints to signal motion discontinuities. In this paper, we show how the fundamental building block of the BIHA scheme can be used to perform bidirectional, occlusion-aware temporal frame interpolation (BOA-TFI...


TCP-friendly Internet video streaming employing variable frame-rate encoding and interpolation

A feedback-based Internet video transmission scheme based on the ITU-T H.263+ is presented. The proposed system is capable of continually adapting its stream size and managing packet loss recovery in response to network condition changes. It consists of multiple components: TCP-friendly end-to-end congestion control and available bandwidth estimation, encoding frame-rate control and de...



Overview of View Synthesis Prediction for Multi-view Video Coding

With a wide range of viewing angles, multi-view video can provide a more realistic experience at arbitrary viewpoints. However, because of the huge amount of data from multiple cameras, an efficient coding method is needed. One of the promising approaches for multi-view video coding is view synthesis prediction, which generates an additional reference frame. In this paper, we explain the p...



Journal title:

Volume   Issue 

Pages  -

Publication date: 2018